On the Insufficiency of Existing Momentum Schemes for Stochastic Optimization
Abstract
Momentum-based stochastic gradient methods such as heavy ball (HB) and Nesterov’s accelerated gradient descent (NAG) are widely used in practice for training deep networks and other supervised learning models, as they often provide significant improvements over stochastic gradient descent (SGD). In general, “fast gradient” methods have provable improvements over gradient descent only in the deterministic case, where the gradients are exact. In the stochastic case, the popular explanation for their wide applicability is that these fast gradient methods partially mimic their exact-gradient counterparts, resulting in some practical gain. This work provides a counterpoint to this belief by proving that there are simple problem instances where these methods cannot outperform SGD despite the best setting of their parameters. These negative problem instances are, in an informal sense, generic; they do not look like carefully constructed pathological instances. These results suggest (along with empirical evidence) that the practical performance gains of HB and NAG are a by-product of minibatching. Furthermore, this work provides a viable (and provable) alternative which, on the same set of problem instances, significantly improves over the performance of HB, NAG, and SGD. This algorithm, denoted ASGD, is a simple-to-implement stochastic algorithm based on a relatively less popular version of Nesterov’s AGD. Extensive empirical results in this paper show that ASGD has performance gains over HB, NAG, and SGD.
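For concreteness, the baseline updates being compared can be written in a few lines. Below is a minimal NumPy sketch of the SGD, HB, and NAG steps applied to a generic stochastic gradient oracle; the toy ill-conditioned quadratic, the noise model, and the step-size and momentum values are illustrative assumptions, and the specific ASGD variant analyzed in the paper is not reproduced here.

```python
import numpy as np

def sgd_step(w, grad, lr):
    """Plain stochastic gradient descent update."""
    return w - lr * grad(w)

def hb_step(w, v, grad, lr, momentum):
    """Heavy ball (Polyak) momentum: reuse the previous search direction."""
    v_new = momentum * v - lr * grad(w)
    return w + v_new, v_new

def nag_step(w, v, grad, lr, momentum):
    """Nesterov momentum (Sutskever et al. form): gradient at the look-ahead point."""
    v_new = momentum * v - lr * grad(w + momentum * v)
    return w + v_new, v_new

# Toy usage on a noisy, ill-conditioned quadratic f(w) = 0.5 * w @ A @ w
# (illustrative assumption, not one of the paper's problem instances).
rng = np.random.default_rng(0)
A = np.diag([1.0, 0.01])
grad = lambda w: A @ w + 0.01 * rng.standard_normal(2)  # stochastic gradient oracle

w, v = np.array([1.0, 1.0]), np.zeros(2)
for _ in range(2000):
    w, v = nag_step(w, v, grad, lr=0.1, momentum=0.9)
```

Written this way, the only difference between HB and NAG is where the stochastic gradient is evaluated (the current iterate versus the look-ahead point), which makes it plausible that, with inexact gradients, neither automatically retains the acceleration of its deterministic counterpart.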
Similar resources
APPROXIMATION OF STOCHASTIC PARABOLIC DIFFERENTIAL EQUATIONS WITH TWO DIFFERENT FINITE DIFFERENCE SCHEMES
We focus on the use of two stable and accurate explicit finite difference schemes in order to approximate the solution of stochastic partial differential equations of Itô type, in particular, parabolic equations. The main properties of these deterministic difference methods, i.e., convergence, consistency, and stability, are separately developed for the stochastic cases.
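The schemes studied in that paper are not reproduced in this listing, but the general recipe is easy to illustrate: centered differences for the spatial operator combined with an explicit Euler–Maruyama step in time. The specific equation (a stochastic heat equation with multiplicative noise), the Dirichlet boundary conditions, the parameter values, and the function name below are illustrative assumptions, not the paper's schemes.

```python
import numpy as np

def explicit_sfd_heat(nx=101, nt=2000, L=1.0, T=0.1, nu=1.0, sigma=0.5, seed=0):
    """Explicit finite-difference / Euler-Maruyama sketch for
    du = nu * u_xx dt + sigma * u dW, with zero Dirichlet boundaries."""
    rng = np.random.default_rng(seed)
    dx, dt = L / (nx - 1), T / nt
    assert nu * dt / dx**2 <= 0.5, "explicit scheme needs a smaller time step"
    x = np.linspace(0.0, L, nx)
    u = np.sin(np.pi * x)                          # smooth initial condition
    for _ in range(nt):
        lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
        noise = rng.standard_normal(nx) * np.sqrt(dt)
        u = u + nu * dt * lap + sigma * u * noise  # Euler-Maruyama update
        u[0] = u[-1] = 0.0                         # enforce Dirichlet boundaries
    return x, u

x, u = explicit_sfd_heat()
```

The assertion encodes the familiar CFL-type restriction of the explicit deterministic heat scheme; whether the noise term imposes additional conditions is exactly the kind of stability question the abstract above refers to.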
HYBRID COLLIDING BODIES OPTIMIZATION AND SINE COSINE ALGORITHM FOR OPTIMUM DESIGN OF STRUCTURES
Colliding Bodies Optimization (CBO) is a population-based metaheuristic algorithm that complies with the physics laws of momentum and energy. Because CBO is susceptible to stagnation through premature convergence and falling into local optima, some meritorious methodologies based on the Sine Cosine Algorithm and a mutation operator were considered to mitigate the shortcomings mentioned earlier. Sine Cosine Al...
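As a rough illustration of what a Sine Cosine move contributes to such a hybrid, the sketch below implements the standard SCA position update (Mirjalili, 2016) on a toy sphere objective; the population size, the schedule parameter a, and the objective are illustrative assumptions, and the paper's actual CBO hybridization and mutation operator are not reproduced.

```python
import numpy as np

def sca_move(X, best, t, T, rng, a=2.0):
    """One Sine Cosine Algorithm position update: each candidate oscillates
    towards / around the best solution found so far."""
    n, d = X.shape
    r1 = a - t * a / T                               # shrinking oscillation amplitude
    r2 = rng.uniform(0.0, 2.0 * np.pi, size=(n, d))
    r3 = rng.uniform(0.0, 2.0, size=(n, d))
    r4 = rng.uniform(size=(n, d))
    step = r1 * np.where(r4 < 0.5, np.sin(r2), np.cos(r2)) * np.abs(r3 * best - X)
    return X + step

# Toy usage: drive a small population towards the minimum of the sphere function.
rng = np.random.default_rng(1)
X = rng.uniform(-5.0, 5.0, size=(30, 10))
f = lambda P: np.sum(P**2, axis=1)
best = X[np.argmin(f(X))].copy()
for t in range(200):
    X = sca_move(X, best, t, T=200, rng=rng)
    cand = X[np.argmin(f(X))]
    if f(cand[None, :])[0] < f(best[None, :])[0]:
        best = cand.copy()                           # keep the best-so-far (elitism)
```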